Words near each other
・ Quantum Markov chain
・ Quantum master equation
・ Quantum mechanical Bell test prediction
・ Quantum mechanical scattering of photon and nucleus
・ Quantum mechanics
・ Quantum mechanics of time travel
・ Quantum meruit
・ Quantum metamaterials
・ Quantum metrology
・ Quantum mind
・ Quantum mirage
・ Quantum Mistake
・ Quantum money
・ Quantum Monte Carlo
・ Quantum Moves
・ Quantum mutual information
・ Quantum mysticism
・ Quantum nanoscience
・ Quantum network
・ Quantum neural network
・ Quantum no-deleting theorem
・ Quantum noise
・ Quantum non-equilibrium
・ Quantum nondemolition measurement
・ Quantum nonlocality
・ Quantum number
・ Quantum of Solace
・ Quantum of Solace (disambiguation)
・ Quantum of Solace (soundtrack)
・ Quantum on the Bay



Quantum mutual information : English Wikipedia
Quantum mutual information

In quantum information theory, quantum mutual information, or von Neumann mutual information, after John von Neumann, is a measure of correlation between subsystems of a quantum state. It is the quantum-mechanical analog of Shannon mutual information.
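For concreteness, the quantity can be evaluated numerically. The following is a minimal Python/numpy sketch, assuming the standard definition ''I''(''A'':''B'') = ''S''(ρ_A) + ''S''(ρ_B) − ''S''(ρ_AB), where ''S'' is the von Neumann entropy in bits; the helper names and the choice of a Bell state are illustrative, not taken from the article:

 import numpy as np
 
 def von_neumann_entropy(rho):
     # S(rho) = -Tr(rho log2 rho), computed from the eigenvalues; 0 log 0 = 0.
     evals = np.linalg.eigvalsh(rho)
     evals = evals[evals > 1e-12]
     return float(-np.sum(evals * np.log2(evals)))
 
 def partial_trace(rho, keep):
     # Partial trace of a two-qubit density matrix; keep = 0 keeps A, keep = 1 keeps B.
     r = rho.reshape(2, 2, 2, 2)  # indices (a, b, a', b')
     return np.trace(r, axis1=1, axis2=3) if keep == 0 else np.trace(r, axis1=0, axis2=2)
 
 # Bell state (|00> + |11>)/sqrt(2): maximally correlated subsystems.
 phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
 rho_ab = np.outer(phi, phi.conj())
 rho_a = partial_trace(rho_ab, 0)
 rho_b = partial_trace(rho_ab, 1)
 
 # I(A:B) = S(A) + S(B) - S(AB) = 1 + 1 - 0 = 2 bits for a Bell pair.
 mi = (von_neumann_entropy(rho_a) + von_neumann_entropy(rho_b)
       - von_neumann_entropy(rho_ab))
 print(mi)  # ~2.0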
== Motivation ==

For simplicity, it will be assumed that all objects in the article are finite-dimensional.
The definition of quantum mutual information is motivated by the classical case. For a probability distribution of two variables ''p''(''x'', ''y''), the two marginal distributions are
:p(x) = \sum_{y} p(x,y) \; , \; p(y) = \sum_{x} p(x,y).
The classical mutual information ''I''(''X'', ''Y'') is defined by
:\;I(X,Y) = S(p(x)) + S(p(y)) - S(p(x,y))
where ''S''(''q'') denotes the Shannon entropy of the probability distribution ''q''.
One can calculate directly
:\; S(p(x)) + S(p(y))
:\; = -\left( \sum_x p(x) \log p(x) + \sum_y p(y) \log p(y) \right)
:\; = -\left( \sum_x \left( \sum_{y'} p(x,y') \log \sum_{y'} p(x,y') \right) + \sum_y \left( \sum_{x'} p(x',y) \log \sum_{x'} p(x',y) \right) \right)
:\; = -\sum_{x,y} p(x,y) \left( \log \sum_{y'} p(x,y') + \log \sum_{x'} p(x',y) \right)
:\; = -\sum_{x,y} p(x,y) \log p(x) p(y) .
So the mutual information is
:I(X,Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)p(y)}.
But this is precisely the relative entropy between ''p''(''x'', ''y'') and ''p''(''x'')''p''(''y''). In other words, if we assume the two variables ''x'' and ''y'' to be independent, the mutual information is the ''discrepancy in uncertainty'' resulting from this (possibly erroneous) assumption.
It follows from the non-negativity of relative entropy that ''I''(''X'', ''Y'') ≥ 0, with equality if and only if ''p''(''x'', ''y'') = ''p''(''x'')''p''(''y'').
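As a numerical check on this identity, here is a short Python/numpy sketch (the joint distribution is an arbitrary illustrative example, not from the article) that computes ''I''(''X'', ''Y'') both from the three Shannon entropies and as the relative entropy between ''p''(''x'', ''y'') and ''p''(''x'')''p''(''y''), and confirms that a product distribution gives zero:

 import numpy as np
 
 def shannon_entropy(p):
     # S(p) = -sum p log2 p, with the convention 0 log 0 = 0.
     p = p[p > 0]
     return float(-np.sum(p * np.log2(p)))
 
 def mutual_information(pxy):
     # I(X,Y) = S(p(x)) + S(p(y)) - S(p(x,y)) for a joint table pxy[x, y].
     px, py = pxy.sum(axis=1), pxy.sum(axis=0)
     return shannon_entropy(px) + shannon_entropy(py) - shannon_entropy(pxy.ravel())
 
 def relative_entropy(p, q):
     # D(p || q) = sum p log2 (p / q), restricted to the support of p.
     mask = p > 0
     return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))
 
 # A correlated joint distribution p(x, y) (illustrative numbers).
 pxy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
 px, py = pxy.sum(axis=1), pxy.sum(axis=0)
 product = np.outer(px, py)  # p(x) p(y)
 
 print(mutual_information(pxy))                         # ~0.278 bits
 print(relative_entropy(pxy.ravel(), product.ravel()))  # same value
 print(mutual_information(product))                     # 0: independent case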

Excerpt source: the free encyclopedia Wikipedia
Read the full text of "Quantum mutual information" on Wikipedia


